Modern Hopfield Networks and Attention for Immune Repertoire Classification
A central mechanism in machine learning is to identify, store, and recognize patterns. How to learn, access, and retrieve such patterns is crucial in Hopfield networks and the more recent transformer architectures. We show that the attention mechanism of transformer architectures is actually the update rule of modern Hopfield networks that can store exponentially many patterns. We exploit this high storage capacity of modern Hopfield networks to solve a challenging multiple instance learning (MIL) problem in computational biology: immune repertoire classification. In immune repertoire classification, a vast number of immune receptors are used to predict the immune status of an individual. This constitutes a MIL problem with an unprecedentedly massive number of instances, two orders of magnitude larger than currently considered problems, and with an extremely low witness rate. Accurate and interpretable machine learning methods solving this problem could pave the way towards new vaccines and therapies, which is currently a very relevant research topic intensified by the COVID-19 crisis.
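The stated equivalence can be made concrete: with stored patterns as the columns of a matrix X and a state (query) vector ξ, the modern Hopfield update is ξ_new = X softmax(β Xᵀ ξ), which for β = 1/√d is exactly the scaled dot-product attention of transformers with Q = ξ and K = V = X. A minimal NumPy sketch of this identity (all variable names and shapes are illustrative, not taken from the paper's code):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

d, N = 4, 6                      # pattern dimension, number of stored patterns
rng = np.random.default_rng(0)
X = rng.normal(size=(d, N))      # N stored patterns of dimension d (columns)
xi = rng.normal(size=(d,))       # current state (query)
beta = 1.0 / np.sqrt(d)          # inverse temperature; 1/sqrt(d) matches attention scaling

# Modern Hopfield update: xi_new = X softmax(beta * X^T xi)
xi_new = X @ softmax(beta * (X.T @ xi))

# The same computation phrased as transformer attention with Q = xi, K = V = X:
attn = softmax((X.T @ xi) / np.sqrt(d))  # attention weights over stored patterns
xi_attn = X @ attn

assert np.allclose(xi_new, xi_attn)      # identical up to floating point
```

A single such update retrieves the stored pattern closest to the query; the exponential storage capacity claimed in the abstract refers to how many patterns this retrieval can distinguish.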
Review for NeurIPS paper: Modern Hopfield Networks and Attention for Immune Repertoire Classification
Weaknesses: This manuscript contains a highly theoretical analysis of modern Hopfield networks and their relationship to the attention mechanism of a transformer model. It also contains a deep model that addresses the machine learning task of immune repertoire classification. The major issue with this submission is that the connection between the two topics addressed in this paper, (i) classification of immune repertoires, and (ii) equivalence of the update rule of modern Hopfield networks and the attention mechanism of the transformer, is at best unclear. It feels as though two distinct papers have been condensed into one. Overall, combining these two results into a single paper yields a main text that does not provide sufficient detail about either.
Modern Hopfield Networks and Attention for Immune Repertoire Classification
Widrich, Michael, Schäfl, Bernhard, Ramsauer, Hubert, Pavlović, Milena, Gruber, Lukas, Holzleitner, Markus, Brandstetter, Johannes, Sandve, Geir Kjetil, Greiff, Victor, Hochreiter, Sepp, Klambauer, Günter
A central mechanism in machine learning is to identify, store, and recognize patterns. How to learn, access, and retrieve such patterns is crucial in Hopfield networks and the more recent transformer architectures. We show that the attention mechanism of transformer architectures is actually the update rule of modern Hopfield networks that can store exponentially many patterns. We exploit this high storage capacity of modern Hopfield networks to solve a challenging multiple instance learning (MIL) problem in computational biology: immune repertoire classification. Accurate and interpretable machine learning methods solving this problem could pave the way towards new vaccines and therapies, which is currently a very relevant research topic intensified by the COVID-19 crisis. Immune repertoire classification based on the vast number of immunosequences of an individual is a MIL problem with an unprecedentedly massive number of instances, two orders of magnitude larger than currently considered problems, and with an extremely low witness rate. In this work, we present our novel method DeepRC that integrates transformer-like attention, or equivalently modern Hopfield networks, into deep learning architectures for massive MIL such as immune repertoire classification. We demonstrate that DeepRC outperforms all other methods with respect to predictive performance on large-scale experiments, including simulated and real-world virus infection data, and enables the extraction of sequence motifs that are connected to a given disease class. Source code and datasets: https://github.com/ml-jku/DeepRC
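The massive-MIL setting described in the abstract can be sketched as attention pooling over a bag of instances: each immune receptor sequence is embedded, a learned query scores every instance, and the softmax-weighted sum produces a fixed-size repertoire representation for a classifier head. The following NumPy sketch illustrates this pooling step only; all names, shapes, and the random embeddings are illustrative assumptions, not DeepRC's actual implementation (which derives instance embeddings from a sequence encoder):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(1)

n_instances, d = 10_000, 32             # bag size (receptor sequences) and embedding dim
H = rng.normal(size=(n_instances, d))   # instance embeddings (placeholder for an encoder)
q = rng.normal(size=(d,))               # learned attention query (illustrative)

# Attention pooling: score each instance, softmax over the bag, weighted sum.
scores = H @ q / np.sqrt(d)             # shape (n_instances,)
weights = softmax(scores)               # sums to 1 over the bag
z = weights @ H                         # fixed-size repertoire representation, shape (d,)

# z would then feed a small classifier head predicting the immune status;
# the per-instance weights are what makes disease-associated motifs inspectable.
```

Because the pooled representation is a weighted sum, the attention weights over instances are also what enables the motif extraction mentioned above: highly weighted sequences are the candidates connected to the disease class, even at the extremely low witness rates of this problem.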